A proximal multiplier method for separable convex minimization
Authors
Abstract
In this paper, we propose an inexact proximal multiplier method that uses proximal distances to solve convex minimization problems with a separable structure. The proposed method unifies the work of Chen and Teboulle (the PCPM method), Kyono and Fukushima (NPCPMM) and Auslender and Teboulle (EPDM), and extends the convergence results to the class of φ-divergence distances. We prove, under standard assumptions, that the iterates generated by the method are well defined and that the sequence converges to an optimal solution of the problem.
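For orientation, here is a minimal sketch of the separable model and of a predictor-corrector proximal multiplier step of the kind the paper builds on; the symbols (f, g, A, the stepsize λ_k and the proximal distance d) are illustrative, not the paper's exact notation. The prototype problem is

\[
\min_{x,\,z}\ f(x) + g(z) \quad \text{subject to} \quad Ax = z,
\]

with a multiplier y attached to the coupling constraint. A PCPM-style iteration, with the usual quadratic proximal terms replaced by a proximal distance d(·,·) (for instance one induced by a φ-divergence), reads

\[
\begin{aligned}
\tilde{y}^{\,k} &= y^{k} + \lambda_k \bigl( A x^{k} - z^{k} \bigr) && \text{(multiplier prediction)} \\
x^{k+1} &\in \operatorname*{arg\,min}_{x} \Bigl\{ f(x) + \langle \tilde{y}^{\,k}, Ax \rangle + \tfrac{1}{\lambda_k}\, d(x, x^{k}) \Bigr\} \\
z^{k+1} &\in \operatorname*{arg\,min}_{z} \Bigl\{ g(z) - \langle \tilde{y}^{\,k}, z \rangle + \tfrac{1}{\lambda_k}\, d(z, z^{k}) \Bigr\} \\
y^{k+1} &= y^{k} + \lambda_k \bigl( A x^{k+1} - z^{k+1} \bigr) && \text{(multiplier correction)}
\end{aligned}
\]

With d(u, v) = ½‖u − v‖² this is the quadratic PCPM scheme of Chen and Teboulle; the point of the paper, as stated in the abstract, is that the subproblems may be solved inexactly and the quadratic terms replaced by general proximal distances, in particular φ-divergence based ones, while convergence to an optimal solution is retained.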
Related works
Convergence rate of a proximal multiplier algorithm for separable convex minimization
This paper analyzes the convergence rate of a proximal algorithm called the Proximal Multiplier Algorithm with Proximal Distances (PMAPD), proposed by us in [20], for solving convex minimization problems with a separable structure. We prove that, under mild assumptions, its primal-dual sequences converge linearly to the optimal solution for a class of proximal distances.
A note on the ergodic convergence of symmetric alternating proximal gradient method
We consider the alternating proximal gradient method (APGM) proposed to solve a convex minimization model with linear constraints and a separable objective function that is the sum of two functions without coupled variables. Inspired by the Peaceman-Rachford splitting method (PRSM), a natural idea is to extend APGM to the symmetric alternating proximal gradient method (SAPGM), which can be viewed as ...
Improving an ADMM-like Splitting Method via Positive-Indefinite Proximal Regularization for Three-Block Separable Convex Minimization
The augmented Lagrangian method (ALM) is fundamental for solving convex minimization models with linear constraints. When the objective function is separable, so that it can be represented as the sum of more than one function without coupled variables, various splitting versions of the ALM have been well studied in the literature, such as the alternating direction method of multiplier...
On the Proximal Jacobian Decomposition of ALM for Multiple-Block Separable Convex Minimization Problems and Its Relationship to ADMM
The augmented Lagrangian method (ALM) is a benchmark for solving convex minimization problems with linear constraints. When the objective function of the model under consideration is representable as the sum of some functions without coupled variables, a Jacobian or Gauss-Seidel decomposition is often implemented to decompose the ALM subproblems so that the functions’ properties could be used m...
Partial Proximal Minimization Algorithms for Convex Programming
An extension of the proximal minimization algorithm is considered where only some of the minimization variables appear in the quadratic proximal term. The resulting iterates are interpreted in terms of the iterates of the standard algorithm, and a uniform descent property is shown that holds independently of the proximal terms used. This property is used to give simple convergence proofs of par...
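Several of the related works above start from splitting versions of the augmented Lagrangian method for separable models. As a point of reference only (the notation f, g, A, B, b, β is assumed here and not taken from any of the cited papers), the standard two-block alternating direction method of multipliers for \(\min\, f(x) + g(z)\) subject to \(Ax + Bz = b\), with penalty β > 0, is

\[
\begin{aligned}
x^{k+1} &\in \operatorname*{arg\,min}_{x} \Bigl\{ f(x) + \langle y^{k}, Ax \rangle + \tfrac{\beta}{2}\, \| Ax + Bz^{k} - b \|^{2} \Bigr\} \\
z^{k+1} &\in \operatorname*{arg\,min}_{z} \Bigl\{ g(z) + \langle y^{k}, Bz \rangle + \tfrac{\beta}{2}\, \| Ax^{k+1} + Bz - b \|^{2} \Bigr\} \\
y^{k+1} &= y^{k} + \beta \bigl( Ax^{k+1} + Bz^{k+1} - b \bigr),
\end{aligned}
\]

and the variants discussed above (Jacobian decomposition, positive-indefinite proximal regularization, three-block extensions) can be read as modifications of this template.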